MergeIterator benchmark: add more realistic sizes
At 15-second scrape intervals a chunk covers 30 minutes, so 1,000 chunks
is about three weeks, a highly unrepresentative test.

Instant queries, such as those done by the ruler, will only fetch one
chunk from each ingester.

Signed-off-by: Bryan Boreham <[email protected]>
bboreham committed Jul 6, 2021
1 parent 47def0d commit aacbbd5
Showing 1 changed file with 4 additions and 0 deletions.
4 changes: 4 additions & 0 deletions pkg/querier/batch/batch_test.go
@@ -27,6 +27,10 @@ func BenchmarkNewChunkMergeIterator_CreateAndIterate(b *testing.B) {
 		{numChunks: 1000, numSamplesPerChunk: 100, duplicationFactor: 3, enc: promchunk.DoubleDelta},
 		{numChunks: 1000, numSamplesPerChunk: 100, duplicationFactor: 1, enc: promchunk.PrometheusXorChunk},
 		{numChunks: 1000, numSamplesPerChunk: 100, duplicationFactor: 3, enc: promchunk.PrometheusXorChunk},
+		{numChunks: 100, numSamplesPerChunk: 100, duplicationFactor: 1, enc: promchunk.PrometheusXorChunk},
+		{numChunks: 100, numSamplesPerChunk: 100, duplicationFactor: 3, enc: promchunk.PrometheusXorChunk},
+		{numChunks: 1, numSamplesPerChunk: 100, duplicationFactor: 1, enc: promchunk.PrometheusXorChunk},
+		{numChunks: 1, numSamplesPerChunk: 100, duplicationFactor: 3, enc: promchunk.PrometheusXorChunk},
 	}
 
 	for _, scenario := range scenarios {
