Issue with ERR_BUFFER_OUT_OF_BOUNDS #1374
Comments
I have had the same problem 🥲
Can you reproduce this in 2.0.1? If yes, please share the raw buffer. The easiest way to get it is probably to put a breakpoint here: https://github.com/tulios/kafkajs/blob/master/src/protocol/requests/fetch/v11/response.js#L48 and then copy the value there. Note that this will include the message data, so only share it if it doesn't contain any sensitive data.
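In case it helps anyone capture that buffer: a small self-contained sketch of a hex-dump helper for sharing the raw bytes once you have them in the debugger (nothing kafkajs-specific here; the sample bytes are placeholders):

```ts
// Format a Buffer as offset-prefixed hex lines for pasting into an issue.
function hexDump(buf: Buffer, width = 16): string {
  const lines: string[] = [];
  for (let i = 0; i < buf.length; i += width) {
    const chunk = buf.subarray(i, i + width);
    lines.push(`${i.toString(16).padStart(8, '0')}  ${chunk.toString('hex')}`);
  }
  return lines.join('\n');
}

console.log(hexDump(Buffer.from([0x00, 0x00, 0x00, 0x2a]))); // placeholder bytes
```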
Yes, I've changed the version to 2.0.1 and debugged at the path you sent.
Interesting problem; waiting for the experts.
Same issue on 2.0.2.
Encountered this issue with 2.2.0; when reverting to 1.16.0, it works just fine.
Have the same issue with 1.16.0.
Kafka v3.3.1: I have a similar issue (i.e., ERR_BUFFER_OUT_OF_BOUNDS with v2.2.3), but the stack trace is not exactly the same as the one below:
Did anyone find a solution for this?
Does anyone have a solution? I'm not sure if it is caused by my consumer or producer. As a temporary workaround, I'm hoping to at least catch this exception gracefully (not via process.on('uncaughtException')).
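One option while waiting for a real fix: kafkajs exposes instrumentation events, and the consumer emits a `consumer.crash` event you can hook instead of `process.on('uncaughtException')`. A minimal sketch (the `restart` flag in the payload is how I read the typings; double-check the payload shape on your kafkajs version):

```ts
import { Kafka } from 'kafkajs';

const kafka = new Kafka({ clientId: 'my-app', brokers: ['localhost:9092'] });
const consumer = kafka.consumer({ groupId: 'my-group' });

consumer.on(consumer.events.CRASH, async ({ payload }) => {
  console.error('Consumer crashed:', payload.error.message);
  if (!payload.restart) {
    // kafkajs will not restart the consumer on its own in this case;
    // decide here whether to reconnect manually or shut down cleanly.
    await consumer.disconnect();
  }
});
```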
Hi, if `expectedResponseSize < 4`, the code will continue the flow, but then we will reach line 429. So I think there should be some data-integrity validation to make sure we don't get an invalid `expectedResponseSize` value.
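For illustration, a sketch of the kind of guard being suggested; this is not kafkajs code, just the shape of the check (the threshold of 4 matches the size of the int32 length prefix):

```ts
// Hypothetical validation before trusting a length prefix read off the wire.
function validateExpectedResponseSize(expectedResponseSize: number): void {
  if (!Number.isInteger(expectedResponseSize) || expectedResponseSize < 4) {
    throw new Error(
      `Invalid expectedResponseSize: ${expectedResponseSize} (corrupt or truncated frame?)`
    );
  }
}
```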
Hi, I have created a simple Kafka consumer using kafkajs and I'm getting the same ERR_BUFFER_OUT_OF_BOUNDS. All I want is a simple consumer that fetches data from Kafka.

Consumer code (`consumer.ts`):

```ts
import 'dotenv/config';
import { Kafka, type KafkaMessage, logLevel } from 'kafkajs';
import { nanoid } from 'nanoid';

const brokers = process.env.KAFKA_BROKERS
  ? process.env.KAFKA_BROKERS.split(',')
  : ['localhost:9092'];
const topic = process.env.KAFKA_TOPIC ? process.env.KAFKA_TOPIC : 'breeze-logs';
let isConnected = false;

if (brokers.length === 0) {
  console.log('Kafka brokers not found');
}
if (topic === '') {
  console.log('Kafka topic not found');
}

const kafka = new Kafka({
  brokers,
  logLevel: logLevel.DEBUG,
  clientId: nanoid()
});

const consumer = kafka.consumer({ groupId: 'my-group', maxBytesPerPartition: 1024 * 1024 });

consumer.on('consumer.connect', () => {
  isConnected = true;
  console.log('Kafka consumer connected');
});

consumer.on('consumer.disconnect', () => {
  isConnected = false;
  console.log('Kafka consumer disconnected');
});

consumer.on('consumer.crash', () => {
  console.log('Kafka consumer crashed');
});

export async function handleMessage(message: KafkaMessage) {
  console.log(`Kafka consumer received message: ${message.value}`);
}

// Top-level await: requires an ESM (or equivalent TS) setup.
if (!isConnected) {
  try {
    await consumer.connect();
  } catch (e) {
    console.log('Kafka consumer error connecting the consumer: ', String(e));
  }
}

export async function runKafkaConsumer() {
  try {
    await consumer.subscribe({ topic });
    await consumer.run({
      eachMessage: async ({ message }) => {
        // await so handler errors surface in the run() flow
        await handleMessage(message);
      }
    });
  } catch (e) {
    console.error('Kafka consumer error running consumer:', String(e));
  }
}
```

Running the Kafka consumer in hooks:

```ts
// Note: a plain try/catch around an un-awaited call won't see async
// errors, so catch rejections on the promise instead.
runKafkaConsumer().catch((e) => {
  console.log('running kafka consumer', String(e));
});
```

Can someone please help with this?
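For anyone wiring up a consumer like the one above, a minimal sketch of a cleaner entry point (assuming `consumer` is also exported from the module, which it isn't in the original snippet):

```ts
import { consumer, runKafkaConsumer } from './consumer';

async function main(): Promise<void> {
  await runKafkaConsumer();

  // Disconnect on shutdown signals so the consumer leaves the group cleanly.
  for (const signal of ['SIGINT', 'SIGTERM'] as const) {
    process.once(signal, async () => {
      await consumer.disconnect();
      process.exit(0);
    });
  }
}

main().catch((e) => console.error('fatal:', e));
```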
In v1.16.0 it works correctly.
Running a TypeScript project along with kafkajs & kafkajs-snappy, with kafkajs at 2.2.4.
I have the same issue.
I'm facing an issue related to Kafka; my current service uses kafkajs v1.15.0 and is written in Node.js.
Here's the error log the server returns:

```
{"level":"ERROR","timestamp":"2022-05-26T04:26:22.024Z","logger":"kafkajs","message":"[Consumer] Crash: KafkaJSNumberOfRetriesExceeded: Attempt to access memory outside buffer bounds","groupId":"my-group","retryCount":0,"stack":"KafkaJSNonRetriableError\n Caused by: RangeError [ERR_BUFFER_OUT_OF_BOUNDS]: Attempt to access memory outside buffer bounds\n at boundsError (internal/buffer.js:70:11)\n at Buffer.readInt32BE (internal/buffer.js:468:5)\n at Decoder.readInt32 (/var/services/bo-consumer-v2/node_modules/kafkajs/src/protocol/decoder.js:50:31)\n at module.exports (/var/services/bo-consumer-v2/node_modules/kafkajs/src/protocol/recordBatch/v0/decoder.js:43:40)\n at decodeMessages (/var/services/bo-consumer-v2/node_modules/kafkajs/src/protocol/requests/fetch/v4/decodeMessages.js:27:35)\n at runMicrotasks (<anonymous>)\n at processTicksAndRejections (internal/process/task_queues.js:94:5)\n at decodePartition (/var/services/bo-consumer-v2/node_modules/kafkajs/src/protocol/requests/fetch/v11/response.js:39:13)\n at Decoder.readArrayAsync (/var/services/bo-consumer-v2/node_modules/kafkajs/src/protocol/decoder.js:179:18)\n at decodeResponse (/var/services/bo-consumer-v2/node_modules/kafkajs/src/protocol/requests/fetch/v11/response.js:44:15)"}
```
I debugged into the library and realized that if `length == 0` it throws the error:

Has anyone faced this, or does anyone know how to solve it?