
Add Collector API datastore #1703

Merged · 2 commits · Oct 3, 2024
2 changes: 1 addition & 1 deletion docs/site/content/en/docs/Concepts/core-concepts/index.md
@@ -93,7 +93,7 @@ A `Function` is a JavaScript function that is executed on the server side. Funct

## Datasource

A `Datasource` is a **required** top-level organizational construct that defines the source of the data to be stored or retrieved by Horreum. Currently, Horreum supports 2 types of `Datasource`: Postgres and Elasticsearch
A `Datasource` is a **required** top-level organizational construct that defines the source of the data to be stored or retrieved by Horreum. Currently, Horreum supports 3 types of `Datasource`: Postgres, Elasticsearch, and [Collector](https://github.com/Karm/collector)

## Baseline

43 changes: 43 additions & 0 deletions docs/site/content/en/docs/Integrations/collector/index.md
@@ -0,0 +1,43 @@
---
title: Collector API
date: 2024-10-01
description: Use Collector API to query JSON for analysis
categories: [Integration, Datasource]
weight: 1
---

If you have a lot of data already stored in a [Collector](https://github.com/Karm/collector) instance, you can query it and analyze the data for regressions in Horreum.

## Configuration

To configure a `Collector API` backend `Datastore`, you need to be a team administrator. With the correct permissions, you can:

1. Generate a new API key for the `Collector API` backend. See the Collector docs on how to [Create a new API token](https://github.com/Karm/collector?tab=readme-ov-file#create-a-new-api-token)
2. Navigate to the `Administration` -> `Datastores` configuration page, e.g. `http://localhost:8080/admin#datastores`
3. Select the `Team` you wish to configure from the `Team` dropdown
4. Click `New Datastore`
{{% imgproc new-datastore Fit "1115x469" %}}
New Datastore
{{% /imgproc %}}
5. Configure the `Collector API` Datastore:
{{% imgproc modal Fit "1115x469" %}}
New Collector API Datastore
{{% /imgproc %}}
   1. Select `Collector API` from the `Datastore Type` dropdown
   2. Provide a `Name` for the Datastore
   3. Enter the `URL` of the Collector instance
   4. Enter the `API Key` for the Collector instance, generated in step 1
   5. Click `Save`

## Test Configuration

To configure a test to use the `Collector API` backend, you can:

1. Navigate to a test configuration page, e.g. `http://localhost:8080/test/10`
2. Select the `Collector API` backend defined in the `Datastores` configuration from the `Datastore` dropdown

{{% imgproc configure-test Fit "1115x469" %}}
Configure Test
{{% /imgproc %}}

3. Click `Save`
63 changes: 63 additions & 0 deletions docs/site/content/en/docs/Tutorials/query-collector/index.md
@@ -0,0 +1,63 @@
---
title: Query Collector API
date: 2024-10-01
description: Query JSON data from Collector and analyze it with Horreum
categories: [Tutorial]
weight: 3
---

> **Prerequisites**:
>
> 1. Horreum is running, and you are logged in
> 2. You have access to a running [Collector](https://github.com/Karm/collector) instance that already contains JSON data
> 3. You have previously defined a `Schema` for the JSON data you wish to analyze; see [Define a Schema](/docs/tasks/define-schema-and-views/)

## Create a Test and query data from a Collector instance

This tutorial will guide you through how to connect to a remote [Collector](https://github.com/Karm/collector) instance and perform change detection on data it already holds.

## Configure Collector Datastore

Please follow the [Collector Integration](/docs/integrations/collector/) guide to configure a new Collector Datastore.

## Query Data from Collector

The procedure is the same as described in the [Upload your first Run](/docs/tutorials/create-test-run/) tutorial.

To query data from a Collector instance, you need to know the `tag` and `imgName` of the data you wish to analyze.
You will also need to determine the date range of the data you wish to analyze using `newerThan` and `olderThan`.

```json
{
"tag": "quarkus-main-ci",
"imgName": "quarkus-integration-test-main-999-SNAPSHOT-runner",
"newerThan": "2024-09-20 00:00:00.000",
"olderThan": "2024-09-25 00:00:00.000"
}
```

where:

- **tag**: the tag of the data you wish to analyze
- **imgName**: the image name (aka test) of the data you wish to analyze
- **newerThan**: the start date of the data you wish to analyze
- **olderThan**: the end date of the data you wish to analyze
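Based on the backend code included in this PR, Horreum forwards these fields as query parameters to the Collector endpoint, replacing the spaces in the timestamps with `%20`. A minimal shell sketch of the request it builds — the Collector URL is a placeholder, not a real instance:

```bash
# Sketch of the request Horreum builds for Collector (placeholder values).
COLLECTOR_URL='https://collector.example.com/api/v1/image-stats'
TAG='quarkus-main-ci'
IMG_NAME='quarkus-integration-test-main-999-SNAPSHOT-runner'
# Spaces in the timestamps must be encoded as %20
NEWER_THAN=$(printf '%s' '2024-09-20 00:00:00.000' | sed 's/ /%20/g')
OLDER_THAN=$(printf '%s' '2024-09-25 00:00:00.000' | sed 's/ /%20/g')

QUERY="${COLLECTOR_URL}?tag=${TAG}&imgName=${IMG_NAME}&newerThan=${NEWER_THAN}&olderThan=${OLDER_THAN}"
echo "$QUERY"
# The actual request carries the API key in a `token` header:
# curl -s -H 'Content-Type: application/json' -H "token: $COLLECTOR_API_KEY" "$QUERY"
```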

The query can be executed by making a call to the Horreum API:

```bash
$ curl "http://localhost:8080/api/run/data?test=$TEST&start=$START&stop=$STOP&owner=$OWNER&access=$ACCESS" \
    -s -H 'Content-Type: application/json' -H "Authorization: Bearer $TOKEN" \
    -d @/tmp/collector_query.json
```
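The variables in the call above are not defined by the tutorial; a hedged sketch of how they might be set follows. The test ID, owner, access level, and token values are illustrative assumptions — substitute the values for your own Horreum instance (the exact meaning of `start`/`stop` depends on your schema; check the Horreum API docs):

```bash
# Placeholder values -- adjust to your instance.
TEST='10'                        # ID of the test configured with the Collector datastore
START='$.newerThan'              # JSONPath (or literal) for the run start timestamp
STOP='$.olderThan'               # JSONPath (or literal) for the run stop timestamp
OWNER='dev-team'                 # owning team
ACCESS='PUBLIC'                  # access level for the created runs
TOKEN='<your-horreum-api-token>'

# The query payload referenced by the curl command:
cat > /tmp/collector_query.json <<'EOF'
{
  "tag": "quarkus-main-ci",
  "imgName": "quarkus-integration-test-main-999-SNAPSHOT-runner",
  "newerThan": "2024-09-20 00:00:00.000",
  "olderThan": "2024-09-25 00:00:00.000"
}
EOF
```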

The query will return a list of `RunID`s, one for each JSON object retrieved from Collector and analyzed.

## What Next?

After successfully querying data from Collector, you can now:

- Optionally [Transform Runs to Datasets](/docs/tasks/trasnform-runs-to-datasets/) to aggregate the data into datasets
- [Configure Change Detection](/docs/tasks/configure-change-detection/) to detect regressions in the data
- [Configure Actions](/docs/tasks/configure-actions/) to trigger events when regressions are detected
@@ -16,7 +16,7 @@ weight: 3

## Create a Test and query data from an Elasticsearch instance

This tutorial will present guide you through how to connect to a remote Elasticsearch instance, and perform change detection on existing data in an index.
This tutorial will guide you through how to connect to a remote Elasticsearch instance, and perform change detection on existing data in an index.

## Configure Elasticsearch Datastore

19 changes: 19 additions & 0 deletions docs/site/content/en/openapi/openapi.yaml
@@ -2524,6 +2524,23 @@ components:
- RELATIVE_DIFFERENCE
- EDIVISIVE
type: string
CollectorApiDatastoreConfig:
description: Type of backend datastore
required:
- builtIn
- apiKey
- url
type: object
properties:
builtIn:
description: Built In
type: boolean
apiKey:
description: Collector API KEY
type: string
url:
description: "Collector url, e.g. https://collector.foci.life/api/v1/image-stats"
type: string
ComparisonResult:
description: Result of performing a Comparison
type: object
@@ -2844,6 +2861,7 @@ components:
enum:
- POSTGRES
- ELASTICSEARCH
- COLLECTORAPI
type: string
example: ELASTICSEARCH
DatastoreTestResponse:
@@ -2858,6 +2876,7 @@
enum:
- POSTGRES
- ELASTICSEARCH
- COLLECTORAPI
type: string
example: ELASTICSEARCH
EDivisiveDetectionConfig:
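For reference, a configuration object satisfying the new `CollectorApiDatastoreConfig` schema could look like the following — the URL and key are placeholder values, not a real instance:

```json
{
  "builtIn": false,
  "apiKey": "<collector-api-token>",
  "url": "https://collector.example.com/api/v1/image-stats"
}
```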
@@ -0,0 +1,31 @@
package io.hyperfoil.tools.horreum.api.data.datastore;

import org.eclipse.microprofile.openapi.annotations.enums.SchemaType;
import org.eclipse.microprofile.openapi.annotations.media.Schema;

@Schema(type = SchemaType.OBJECT, required = true, description = "Type of backend datastore")
public class CollectorApiDatastoreConfig extends BaseDatastoreConfig {

public CollectorApiDatastoreConfig() {
super(false);
}

@Schema(type = SchemaType.STRING, required = true, description = "Collector API KEY")
public String apiKey;

@Schema(type = SchemaType.STRING, required = true, description = "Collector url, e.g. https://collector.foci.life/api/v1/image-stats")
public String url;

@Override
public String validateConfig() {
if (apiKey == null || apiKey.isBlank()) {
return "apiKey must be set";
}
if (url == null || url.isBlank()) {
return "url must be set";
}

return null;
}

}
@@ -11,6 +11,8 @@ public enum DatastoreType {
POSTGRES("POSTGRES", new TypeReference<PostgresDatastoreConfig>() {
}),
ELASTICSEARCH("ELASTICSEARCH", new TypeReference<ElasticsearchDatastoreConfig>() {
}),
COLLECTORAPI("COLLECTORAPI", new TypeReference<CollectorApiDatastoreConfig>() {
});

private static final DatastoreType[] VALUES = values();
@@ -0,0 +1,169 @@
package io.hyperfoil.tools.horreum.datastore;

import java.io.IOException;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeParseException;
import java.util.Arrays;
import java.util.Optional;

import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;
import jakarta.ws.rs.BadRequestException;
import jakarta.ws.rs.core.Response;

import org.jboss.logging.Logger;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;

import io.hyperfoil.tools.horreum.api.data.datastore.CollectorApiDatastoreConfig;
import io.hyperfoil.tools.horreum.api.data.datastore.DatastoreType;
import io.hyperfoil.tools.horreum.entity.backend.DatastoreConfigDAO;
import io.hyperfoil.tools.horreum.svc.ServiceException;

@ApplicationScoped
public class CollectorApiDatastore implements Datastore {

protected static final Logger log = Logger.getLogger(CollectorApiDatastore.class);

@Inject
ObjectMapper mapper;

@Override
public DatastoreResponse handleRun(JsonNode payload,
JsonNode metaData,
DatastoreConfigDAO configuration,
Optional<String> schemaUriOptional,
ObjectMapper mapper)
throws BadRequestException {

if (metaData != null) {
log.warn("Metadata is not supported by the Collector API datastore: " + metaData);
throw ServiceException.badRequest("Metadata is not supported by the Collector API datastore");
}
metaData = payload;

final CollectorApiDatastoreConfig jsonDatastoreConfig = getCollectorApiDatastoreConfig(configuration, mapper);

HttpClient client = HttpClient.newHttpClient();
try {
String tag = payload.get("tag").asText();
String imgName = payload.get("imgName").asText();
String newerThan = payload.get("newerThan").asText().replace(" ", "%20"); // Handle spaces in dates
String olderThan = payload.get("olderThan").asText().replace(" ", "%20");

verifyPayload(mapper, jsonDatastoreConfig, client, tag, newerThan, olderThan);

URI uri = URI.create(jsonDatastoreConfig.url
+ "?tag=" + tag
+ "&imgName=" + imgName
+ "&newerThan=" + newerThan
+ "&olderThan=" + olderThan);
HttpRequest.Builder builder = HttpRequest.newBuilder().uri(uri);
builder.header("Content-Type", "application/json")
.header("token", jsonDatastoreConfig.apiKey);
HttpRequest request = builder.build();
HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
if (response.statusCode() != Response.Status.OK.getStatusCode()) {
log.error("Collector API returned " + response.statusCode() + " body : " + response.body());
throw ServiceException
.serverError("Collector API returned " + response.statusCode() + " body : " + response.body());
}

payload = mapper.readTree(response.body());
return new DatastoreResponse(payload, metaData);
} catch (JsonProcessingException e) {
log.error("Error while parsing response from collector API", e);
throw ServiceException.serverError("Error while parsing response from collector API");
} catch (IOException | InterruptedException e) {
log.error("Error while sending request to collector API", e);
throw ServiceException.serverError("Error while sending request to collector API");
}
}

private static void verifyPayload(ObjectMapper mapper, CollectorApiDatastoreConfig jsonDatastoreConfig,
HttpClient client, String tag, String newerThan, String olderThan)
throws IOException, InterruptedException {
// Verify that the tag is in the distinct list of tags
URI tagsUri = URI.create(jsonDatastoreConfig.url + "/tags/distinct");
HttpRequest.Builder tagsBuilder = HttpRequest.newBuilder().uri(tagsUri);
HttpRequest tagsRequest = tagsBuilder
.header("Content-Type", "application/json")
.header("token", jsonDatastoreConfig.apiKey).build();
HttpResponse<String> response = client.send(tagsRequest, HttpResponse.BodyHandlers.ofString());
String[] distinctTags;
try {
distinctTags = mapper.readValue(response.body(), String[].class);
} catch (JsonProcessingException e) {
log.error("Error while parsing response from collector API: " + response.body(), e);
throw ServiceException.badRequest("Error while parsing response from collector API " + response.body());
}
if (distinctTags == null || distinctTags.length == 0) {
log.warn("No tags found in collector API");
throw ServiceException.badRequest("No tags found in collector API");
}
if (Arrays.stream(distinctTags).noneMatch(tag::equals)) {
String tags = String.join(",", distinctTags);
throw ServiceException.badRequest("Tag not found in list of distinct tags: " + tags);
}
// Verify that the dates format is correct
final String DATE_FORMAT = "yyyy-MM-dd%20HH:mm:ss.SSS";
final DateTimeFormatter DATE_FORMATTER = DateTimeFormatter.ofPattern(DATE_FORMAT);
try {
final LocalDateTime oldest = LocalDateTime.parse(newerThan, DATE_FORMATTER);
final LocalDateTime newest = LocalDateTime.parse(olderThan, DATE_FORMATTER);
if (oldest.isAfter(newest)) {
throw ServiceException.badRequest(
"newerThan must be before olderThan (newerThan=" + newerThan + " olderThan=" + olderThan + ")");
}
} catch (DateTimeParseException e) {
throw ServiceException.badRequest(
"Invalid date format (" + newerThan + ", " + olderThan + "). Dates must be in the format " + DATE_FORMAT);
}
}

private static CollectorApiDatastoreConfig getCollectorApiDatastoreConfig(DatastoreConfigDAO configuration,
ObjectMapper mapper) {
final CollectorApiDatastoreConfig jsonDatastoreConfig;
try {
jsonDatastoreConfig = mapper.treeToValue(configuration.configuration,
CollectorApiDatastoreConfig.class);
} catch (JsonProcessingException e) {
throw new RuntimeException(e);
}
if (jsonDatastoreConfig == null) {
log.error("Could not find collector API datastore: " + configuration.name);
throw ServiceException.serverError("Could not find CollectorAPI datastore: " + configuration.name);
}
assert jsonDatastoreConfig.apiKey != null : "API key must be set";
assert jsonDatastoreConfig.url != null : "URL must be set";
return jsonDatastoreConfig;
}

@Override
public DatastoreType type() {
return DatastoreType.COLLECTORAPI;
}

@Override
public UploadType uploadType() {
return UploadType.MUILTI;
}

@Override
public String validateConfig(Object config) {
try {
return mapper.treeToValue((ObjectNode) config, CollectorApiDatastoreConfig.class).validateConfig();
} catch (JsonProcessingException e) {
return "Unable to read configuration. If the problem persists, please contact a system administrator";
}
}

}
@@ -572,7 +572,6 @@ Response addRunFromData(String start, String stop, String test,
Optional.ofNullable(schemaUri), mapper);

List<Integer> runIds = new ArrayList<>();
String responseString;
if (datastore.uploadType() == Datastore.UploadType.MUILTI
&& response.payload instanceof ArrayNode) {
