Headers (outbound and inbound) do not comply with KafkaJS type definitions #45
I can work around this on the consumer side:

```typescript
// Flatten the index-keyed header objects back into a single IHeaders map,
// merging repeated keys into arrays the way KafkaJS does.
const result: IHeaders = {};
for (const header of Object.values(headers)) {
  for (const [key, value] of Object.entries(header as unknown as IHeaders)) {
    if (!value) continue;
    const existing = result[key];
    result[key] = existing
      ? Array.isArray(existing) ? [...existing, value] : [existing, value]
      : value;
  }
}
return result;
```

However, I'm unable to determine a headers format for the producer side that results in correctly delivered headers.
Thanks for filing this. There were a bunch of issues with conversions for the headers, both within JS and within C++. I've fixed those on the development branch; the fixed conversions are OK to use immediately and will match the IHeaders interface. Additionally, there is PR #39 improving the typing support. I've added an example to examples/typescript/kafkajs.ts for reference:

```typescript
headers: {
  'header1': 'value1',
  'header2': [Buffer.from('value2'), 'value3']
}
```
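As a usage note, here is a hedged sketch of producing that header shape through the package's KafkaJS-compatible API; the broker address, topic, and payload are placeholders of mine:

```typescript
import { KafkaJS } from "@confluentinc/kafka-javascript";

const kafka = new KafkaJS.Kafka({ kafkaJS: { brokers: ["localhost:9092"] } });
const producer = kafka.producer();

await producer.connect();
await producer.send({
  topic: "test-topic",
  messages: [{
    value: "payload",
    // Header values may be a single string/Buffer or an array mixing
    // both, per the example above.
    headers: {
      header1: "value1",
      header2: [Buffer.from("value2"), "value3"],
    },
  }],
});
await producer.disconnect();
```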
Closing this, as I released 0.1.15-devel with the fixes; the given example works too.
Hi @milindl, it appears this issue was fully resolved for the original example in this ticket, which uses header1=alpha and header2=beta and now logs headers in the expected shape:

```json
{
  "header1": {
    "type": "Buffer",
    "data": [97, 108, 112, 104, 97]
  },
  "header2": {
    "type": "Buffer",
    "data": [98, 101, 116, 97]
  }
}
```

In the reproduction below, however, which consumes via eachBatch, each header arrives wrapped under its positional index (the output follows the code):

```typescript
import {KafkaJS as Confluent} from "@confluentinc/kafka-javascript";
const topic = "test-confluent-topic";
let receivedCount = 0;
const kafka = new Confluent.Kafka({kafkaJS: {brokers: ["localhost:9092"]}});
const consumer = kafka.consumer({kafkaJS: {groupId: `${topic}-group`}});
await consumer.connect();
await consumer.subscribe({topic});
await consumer.run({
  eachBatch: async ({batch}) => {
    for (const message of batch.messages) {
      // Log the headers exactly as delivered (`log` is the reporter's logger).
      log.info(JSON.stringify(message.headers, null, 2));
      receivedCount++;
    }
  }
});
// `until` is the reporter's helper that polls until the predicate holds.
await until(async () => consumer.assignment().length > 0);
const producer = kafka.producer({"linger.ms": 0});
await producer.connect();
await producer.send({
  topic,
  messages: [{value: "one", headers: {header1: "alpha", header2: "beta"}}]
});
await until(async () => receivedCount == 1);
await producer.disconnect();
await consumer.disconnect();
```

The consumer logs:

```json
{
  "0": {
    "header1": {
      "type": "Buffer",
      "data": [97, 108, 112, 104, 97]
    }
  },
  "1": {
    "header2": {
      "type": "Buffer",
      "data": [98, 101, 116, 97]
    }
  }
}
```
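To make the mismatch concrete, here is a small sketch of the two shapes as TypeScript types; both type names are mine, and ExpectedHeaders reflects my reading of the KafkaJS IHeaders definition:

```typescript
// Shape actually logged by the reproduction: one entry per header,
// keyed by the header's position within the message.
type ObservedHeaders = {
  [index: string]: { [headerKey: string]: Buffer };
};

// Shape a KafkaJS-compatible consumer would expect: one property per
// header key, with repeated keys collected into an array.
type ExpectedHeaders = {
  [headerKey: string]: Buffer | string | (Buffer | string)[] | undefined;
};
```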
Can confirm the same behavior, as I'm actively investigating a port right now. Will open a new issue to highlight this, since this one was closed already.
Original issue description:

The header implementation for both producers and consumers does not comply with the type definitions offered up in kafkajs.d.ts (which are unmodified from the KafkaJS originals).
Below is a comparison between KafkaJS and Confluent.

KafkaJS: [screenshot of delivered headers]
Confluent: [screenshot of delivered headers]
Two (maybe three) notable issues:

- header1=alpha and header2=beta were sent to Kafka as key=header1 and key=header2
- Given the received key=header1 and key=header2, KafkaJS compatibility would dictate a string key of "key" and a string[] value of ["header1", "header2"]
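As a concrete reading of that last point, here is a short sketch of the merge a KafkaJS-compatible consumer would perform on duplicate keys; the variable names are mine:

```typescript
// Two wire-level headers that share the key "key".
const wireHeaders: Array<[string, string]> = [
  ["key", "header1"],
  ["key", "header2"],
];

// Per the KafkaJS IHeaders contract, repeated keys surface as one
// property whose value collects every entry.
const merged: Record<string, string | string[]> = {};
for (const [key, value] of wireHeaders) {
  const existing = merged[key];
  merged[key] = existing === undefined
    ? value
    : Array.isArray(existing) ? [...existing, value] : [existing, value];
}

console.log(merged); // { key: ["header1", "header2"] }
```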