Ejercicio2 #4

Open: wants to merge 7 commits into base: master

Commit message: "Trying cherry pick" / "Index,overview,spaces"
(cherry picked from commit 791c92c)
esthertrigo committed Jan 31, 2022
commit 8c45d1b403553f7a18f7647f0a5b2cd5ee67e7ed
README.md: 57 changes (30 additions, 27 deletions)
@@ -7,7 +7,7 @@ ___

KLoadGen is a Kafka load generator plugin for JMeter designed to work with AVRO and JSON schemas. It allows sending Kafka messages with a structure defined as an AVRO or JSON schema. It connects to the Schema Registry server, retrieves the subject to send, and generates a random message every time.
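The schema-driven generation described above can be pictured with a small AVRO schema; the plugin then produces a random record matching each field on every sampled message. This is a hypothetical schema for illustration, not one shipped with the project:

```json
{
  "type": "record",
  "name": "UserEvent",
  "namespace": "com.example",
  "fields": [
    {"name": "id", "type": "string"},
    {"name": "amount", "type": "double"},
    {"name": "createdAt", "type": {"type": "long", "logicalType": "timestamp-millis"}}
  ]
}
```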

-## Contents
+## Index

* [Getting Started](#getting-started)
* [Producer setup](#producer-setup)
@@ -16,36 +16,39 @@ KLoadGen is Kafka load generator plugin for JMeter designed to work with AVRO an
* [StandAlone execution](#standalone-execution)
* [Special Thanks](#special-thanks)

-## General information
+## Overview

KLoadGen includes the following main components:

-* **Kafka Schema Sampler** - This jmeter java sampler sends messages to kafka, it uses the value and key configuration and generates a data matching that definition.
+* **Kafka Schema Sampler**: This JMeter Java sampler sends messages to Kafka; it uses the value and key configuration and generates data matching that definition.

-* **Kafka Consumer Sampler** - This jmeter java sampler reads messages from kafka, it uses the value and key configuration to deserialize read messages.
+* **Kafka Consumer Sampler**: This JMeter Java sampler reads messages from Kafka; it uses the value and key configuration to deserialize the messages it reads.

-* **Kafka Headers Config** - This jmeter config element generates serialized object messages based on input class and its property configurations.
+* **Kafka Headers Config**: This JMeter config element generates serialized object messages based on an input class and its property configuration.

-* **Value Serialized Config** - This jmeter config element generates plain text messages based on input schema template designed.
+* **Value Serialized Config**: This JMeter config element generates plain-text messages based on the input schema template.

-* **Value File Serialized Config** - This jmeter config element allows to upload a value schema file instead to get it from the Schema Registry.
+* **Value File Serialized Config**: This JMeter config element allows you to upload a value schema file instead of fetching it from the Schema Registry.

-* **Value Deserialized Config** - This jmeter config element allows you to define how the value of a message is deserialized.
+* **Value Deserialized Config**: This JMeter config element allows you to define how the value of a message is deserialized.

-* **Value Deserialized Config** - This jmeter config element allows to upload a value schema file to deserialize messages.
+* **Value Deserialized Config**: This JMeter config element allows you to upload a value schema file to deserialize messages.

-* **Schema Registry Config** - This jmeter config element allows to configure the connection to a Schema Registry, security access,....
+* **Schema Registry Config**: This JMeter config element allows you to configure the connection to a Schema Registry, security access, etc.

-* **Key Serialized Config** - This jmeter config allows to configure a Key Schema from a Schema Registry
+* **Key Serialized Config**: This JMeter config element allows you to configure a key schema from a Schema Registry.

-* **Key File Serialized Config** - This jmeter config allows to upload a key schema file instead to get it from the Schema Registry
+* **Key File Serialized Config**: This JMeter config element allows you to upload a key schema file instead of fetching it from the Schema Registry.

-* **Key Deserialized Config** - This jmeter config element allows you to define how the key of a message is deserialized.
+* **Key Deserialized Config**: This JMeter config element allows you to define how the key of a message is deserialized.

-* **Key File Deserialized Config** - This jmeter config allows to upload a key schema to deserialize message key.
+* **Key File Deserialized Config**: This JMeter config element allows you to upload a key schema to deserialize the message key.

-* **Key Simple Config** - This jmeter config allows to define a simple basic key to send into de message.
+* **Key Simple Config**: This JMeter config element allows you to define a simple, basic key to send in the message.

### Setup

@@ -103,19 +106,19 @@ Once the build is completed, copy target/kloadgen-plugin-<version>.jar fil

### KLoadGenSampler

-* **bootstrap.servers** : broker-ip-1:port, broker-ip-2:port, broker-ip-3:port
-* **zookeeper.servers** : zookeeper-ip-1:port, zookeeper-ip-2:port, zookeeper-ip-3:port. _Optional_
-* **Kafka.topic.name** : Topic on which messages will be sent
-* **keyed.message** : Enable adding a Key to the messages
-* **compression.type** : Kafka producer compression type(none/gzip/snappy/lz4)
-* **batch.size** : messages batch size(increased batch size with compression like lz4 gives better throughput)
-* **linger.ms** : How much maximum time producer should wait till batch becomes full(should be 5-10 when increased batch size and compression is enabled)
-* **buffer.memory** : Total buffer memory for producer.
+* **bootstrap.servers**: broker-ip-1:port, broker-ip-2:port, broker-ip-3:port
+* **zookeeper.servers**: zookeeper-ip-1:port, zookeeper-ip-2:port, zookeeper-ip-3:port. _Optional_
+* **Kafka.topic.name**: Topic to which messages will be sent
+* **keyed.message**: Enables adding a key to the messages
+* **compression.type**: Kafka producer compression type (none/gzip/snappy/lz4)
+* **batch.size**: Message batch size (a larger batch size combined with compression such as lz4 gives better throughput)
+* **linger.ms**: Maximum time the producer should wait for a batch to fill (should be 5-10 when the batch size is increased and compression is enabled)
+* **buffer.memory**: Total buffer memory for the producer.
* **acks** : Message sent acknowledgement, value can be (0/1/-1).
-* **send.buffer.bytes** : The size of the TCP send buffer (SO_SNDBUF) to use when sending data. If the value is -1, the OS default will be used.
-* **receive.buffer.bytes** : The size of the TCP receive buffer (SO_RCVBUF) to use when reading data. If the value is -1, the OS default will be used.
-* **security.protocol** : Kafka producer protocol. Valid values are: PLAINTEXT, SSL, SASL_PLAINTEXT, SASL_SSL.
-* **message.placeholder.key** : Config element message variable name. This name should be same as message placeholder key in serialized/plaintext config element.
+* **send.buffer.bytes**: The size of the TCP send buffer (SO_SNDBUF) to use when sending data. If the value is -1, the OS default will be used.
+* **receive.buffer.bytes**: The size of the TCP receive buffer (SO_RCVBUF) to use when reading data. If the value is -1, the OS default will be used.
+* **security.protocol**: Kafka producer protocol. Valid values are: PLAINTEXT, SSL, SASL_PLAINTEXT, SASL_SSL.
+* **message.placeholder.key**: Config element message variable name. This name should be the same as the message placeholder key in the serialized/plaintext config element.
* **kerberos.auth.enabled** : YES/NO. If it is disabled, all the properties below will be ignored
* **java.security.auth.login.config** : jaas.conf of Kafka Kerberos
* **java.security.krb5.conf** : Kerberos server krb5.conf file
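The producer-level settings above map onto standard Kafka producer properties. A minimal sketch of how they might be assembled in Java (the broker addresses and tuning values here are illustrative assumptions, not recommendations from the project):

```java
import java.util.Properties;

public class ProducerSettingsSketch {

    // Builds producer properties mirroring the sampler fields listed above.
    static Properties producerSettings() {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "broker-ip-1:9092,broker-ip-2:9092");
        props.setProperty("compression.type", "lz4");     // none/gzip/snappy/lz4
        props.setProperty("batch.size", "65536");         // larger batches plus lz4 improve throughput
        props.setProperty("linger.ms", "10");             // max wait for a batch to fill
        props.setProperty("buffer.memory", "33554432");   // total producer buffer memory
        props.setProperty("acks", "1");                   // 0 / 1 / -1
        props.setProperty("send.buffer.bytes", "-1");     // -1 means OS default SO_SNDBUF
        props.setProperty("security.protocol", "PLAINTEXT");
        return props;
    }

    public static void main(String[] args) {
        Properties p = producerSettings();
        System.out.println(p.getProperty("compression.type") + " " + p.getProperty("acks")); // prints "lz4 1"
    }
}
```

In the sampler GUI these values are filled in as plain strings, which is why the sketch uses `setProperty` rather than typed setters.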