Issue with ERR_BUFFER_OUT_OF_BOUNDS · Issue #1374 · tulios/kafkajs · GitHub
Issue with ERR_BUFFER_OUT_OF_BOUNDS #1374

Open
johnnytrile opened this issue May 26, 2022 · 15 comments

Comments

@johnnytrile
johnnytrile commented May 26, 2022

I'm facing an issue related to Kafka. My current service uses KafkaJS v1.15.0 and is written in Node.js.

Here's the error log the server returns.

{"level":"ERROR","timestamp":"2022-05-26T04:26:22.024Z","logger":"kafkajs","message":"[Consumer] Crash: KafkaJSNumberOfRetriesExceeded: Attempt to access memory outside buffer bounds","groupId":"my-group","retryCount":0,"stack":"KafkaJSNonRetriableError\n Caused by: RangeError [ERR_BUFFER_OUT_OF_BOUNDS]: Attempt to access memory outside buffer bounds\n at boundsError (internal/buffer.js:70:11)\n at Buffer.readInt32BE (internal/buffer.js:468:5)\n at Decoder.readInt32 (/var/services/bo-consumer-v2/node_modules/kafkajs/src/protocol/decoder.js:50:31)\n 8000 at module.exports (/var/services/bo-consumer-v2/node_modules/kafkajs/src/protocol/recordBatch/v0/decoder.js:43:40)\n at decodeMessages (/var/services/bo-consumer-v2/node_modules/kafkajs/src/protocol/requests/fetch/v4/decodeMessages.js:27:35)\n at runMicrotasks (<anonymous>)\n at processTicksAndRejections (internal/process/task_queues.js:94:5)\n at decodePartition (/var/services/bo-consumer-v2/node_modules/kafkajs/src/protocol/requests/fetch/v11/response.js:39:13)\n at Decoder.readArrayAsync (/var/services/bo-consumer-v2/node_modules/kafkajs/src/protocol/decoder.js:179:18)\n at decodeResponse (/var/services/bo-consumer-v2/node_modules/kafkajs/src/protocol/requests/fetch/v11/response.js:44:15)"}

I debugged into the library and realized that if length == 0, it throws an error.
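For context, here is a minimal standalone sketch of the underlying Node.js behaviour (not kafkajs internals): reading a 32-bit big-endian integer from a buffer that holds fewer than 4 bytes throws this exact RangeError, which matches the "length == 0" case observed above.

// Minimal illustration of the Node.js error, independent of kafkajs:
// readInt32BE needs 4 bytes, so a shorter buffer triggers ERR_BUFFER_OUT_OF_BOUNDS.
const tooShort = Buffer.alloc(2);

try {
  tooShort.readInt32BE(0); // the same call Decoder.readInt32 makes internally
} catch (err: any) {
  console.error(err.code); // 'ERR_BUFFER_OUT_OF_BOUNDS'
}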

Has anyone faced this problem, or does anyone know how to solve it?

@sumanake

I have had the same problem 🥲

@Nevon
Collaborator
Nevon commented May 31, 2022

Can you reproduce this in 2.0.1? If yes, please share the raw buffer. The easiest way to get it is probably to put a breakpoint here: https://github.com/tulios/kafkajs/blob/master/src/protocol/requests/fetch/v11/response.js#L48 and then copy the value of rawBuffer. Alternatively, you can run the program with the environment variables KAFKAJS_LOG_LEVEL=debug KAFKAJS_DEBUG_PROTOCOL_BUFFERS=1 KAFKAJS_DEBUG_EXTENDED_PROTOCOL_BUFFERS=1 to log the full Fetch response buffer.

Note that this will include the message data, so only share it if it doesn't contain any sensitive data.
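In case it helps when capturing the value at that breakpoint, here is a small sketch (assuming rawBuffer is the Buffer in scope at that line) of how the captured value could be dumped to a file for sharing; the helper name and filename are arbitrary:

import { writeFileSync } from 'fs';

// Hypothetical helper: write the buffer captured at the breakpoint to disk as hex
// so it can be attached to the issue.
function dumpRawBuffer(rawBuffer: Buffer): void {
  writeFileSync('fetch-response.hex', rawBuffer.toString('hex'));
}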

@johnnytrile
Author

Can you reproduce this in 2.0.1? If yes, please share the raw buffer. […]

Yes, I've changed the version to 2.0.1, debugged at the location you sent, and got the rawBuffer.
I couldn't upload the file here, so I posted it on Google Drive instead; I hid some sensitive information. Please review it:
https://drive.google.com/file/d/1RhK5-q-vVpzLBHYP8pDso68JZ9xpq9MD/view?usp=sharing

@HuynhVinhRikin

interesting problem, waiting for the experts


@jld-adriano
jld-adriano commented Jun 7, 2022

Same issue on 2.0.2
kafka 3.1.0

@TheMadKow

Encountered this issue with 2.2.0; when reverting to 1.16.0, it works just fine.

@fdhuseynov

Have the same issue with 1.16.0

@alex-lt-kong
alex-lt-kong commented Jan 11, 2023

Kafka v3.3.1,
Node.js v18.12.1

Have a similar issue (i.e., ERR_BUFFER_OUT_OF_BOUNDS with v2.2.3), but the stack trace is not exactly the same; mine is below:

    throw new ERR_BUFFER_OUT_OF_BOUNDS();
    ^

RangeError [ERR_BUFFER_OUT_OF_BOUNDS]: Attempt to access memory outside buffer bounds
    at new NodeError (node:internal/errors:393:5)
    at boundsError (node:internal/buffer:84:11)
    at Buffer.readInt32BE (node:internal/buffer:482:5)
    at Decoder.readInt32 (<my-project-dir>/src/node_modules/kafkajs/src/protocol/decoder.js:52:31)
    at Connection.processData (<my-project-dir>/src/node_modules/kafkajs/src/network/connection.js:526:38)
    at Socket.onData (<my-project-dir>/src/node_modules/kafkajs/src/network/connection.js:184:14)
    at Socket.emit (node:events:513:28)
    at addChunk (node:internal/streams/readable:324:12)
    at readableAddChunk (node:internal/streams/readable:297:9)
    at Readable.push (node:internal/streams/readable:234:10) {
  code: 'ERR_BUFFER_OUT_OF_BOUNDS'
}

@tareqGhosh

Did anyone find any solution for this?

@ryan65
ryan65 commented Jul 9, 2023

Does anyone have a solution? I'm not sure if it is caused by my consumer or producer.
I can't really reproduce it; it happens very rarely, usually during some kind of network issue.
It always happens with the same stack trace: processData in connection.js, line 429:
const correlationId = response.readInt32()

Actually, a temporary solution I am hoping to find is to at least catch this exception gracefully (not via process.on('uncaughtException')).
A workaround I am also looking for: is there any event I can hook into in the consumer/producer itself to catch this exception myself?
I checked the events.CRASH event, but I see it doesn't catch it.
Maybe there is some other way to catch this within the producer/consumer context, some kind of handler I can pass.
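For reference, a sketch of the two consumer-side hooks kafkajs does expose for crashes. Whether they fire for this particular connection-level RangeError is exactly what is in question here, so treat this as an assumption rather than a confirmed workaround:

import { Kafka } from 'kafkajs';

const kafka = new Kafka({ clientId: 'crash-handling-sketch', brokers: ['localhost:9092'] });

// restartOnFailure lets the consumer decide whether to restart itself after a
// crash instead of bubbling the error up as an uncaught exception.
const consumer = kafka.consumer({
  groupId: 'my-group',
  retry: {
    restartOnFailure: async (error) => {
      console.error('Consumer failed:', error.message);
      return true; // restart the consumer
    },
  },
});

// CRASH fires when the consumer gives up retrying; the payload contains the
// error, the groupId, and whether kafkajs will restart on its own.
consumer.on(consumer.events.CRASH, ({ payload }) => {
  console.error('Consumer crashed:', payload.error.message, 'restart:', payload.restart);
});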

@ryan65
ryan65 commented Jul 10, 2023

Hi,
Looking at the code, I can see that the ONLY way we can get this exception is if the data is somehow corrupted or invalid, i.e. if expectedResponseSize < 4 due to invalid data in the buffers (connection.js):
const expectedResponseSize = decoder.readInt32()

If expectedResponseSize < 4, the code will continue the flow, but then we will reach line 429:
const correlationId = response.readInt32()
and since the buffer size is < 4, it will crash.

So... I think there should be some data integrity validation to make sure we don't get an invalid expectedResponseSize value.
In this case I'm not sure how to handle it, but we must at least close the connection or something, since the stream has lost its integrity.
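To make the suggestion concrete, here is a rough standalone sketch of the kind of validation being described; the function name and error messages are hypothetical, not the actual connection.js internals:

// Hypothetical integrity check for the response size read from the socket stream.
// A Kafka response payload always starts with a 4-byte correlationId, so a value
// below 4 means the stream is corrupted and the connection should be torn down
// instead of letting a later readInt32 run past the end of the buffer.
function readExpectedResponseSize(buffer: Buffer, offset = 0): number {
  if (buffer.length - offset < 4) {
    throw new Error('Not enough bytes in the stream to read the response size');
  }
  const expectedResponseSize = buffer.readInt32BE(offset);
  if (expectedResponseSize < 4) {
    throw new Error(
      'Corrupted stream: expectedResponseSize ' + expectedResponseSize + ' is smaller than the correlationId field'
    );
  }
  return expectedResponseSize;
}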

@mj123muskan
mj123muskan commented Aug 23, 2023

Hi,

I have created a simple Kafka consumer using kafkajs and am getting the same ERR_BUFFER_OUT_OF_BOUNDS. I want to create a simple consumer which fetches data from Kafka.

Kafka consumer code:

consumer.ts

import 'dotenv/config';
import { Kafka, type KafkaMessage, logLevel } from 'kafkajs';
import { nanoid } from 'nanoid';

const brokers = process.env.KAFKA_BROKERS
  ? process.env.KAFKA_BROKERS.split(',')
  : ['localhost:9092'];

const topic = process.env.KAFKA_TOPIC ? process.env.KAFKA_TOPIC : 'breeze-logs';

let isConnected = false;

if (brokers.length === 0) {
  console.log('Kafka brokers not found');
}

if (topic === '') {
  console.log('Kafka topic not found');
}

const kafka = new Kafka({
  brokers: brokers,
  logLevel: logLevel.DEBUG,
  clientId: nanoid()
});

const consumer = kafka.consumer({ groupId: 'my-group', maxBytesPerPartition: 1024 * 1024 });

consumer.on('consumer.connect', () => {
  isConnected = true;
  console.log('kafka consumer connected');

});

consumer.on('consumer.disconnect', () => {
  isConnected = false;
  console.log('Kafka consumer disconnected');
});

consumer.on('consumer.crash', () => {
  console.log('Kafka consumer Crashed');
});

export async function handleMessage(message: KafkaMessage){
  console.log(`Kafka consumer received message: ${message.value}`);
};

if (!isConnected) {
  try {
    await consumer.connect();
  } catch (e) {
    console.log('Kafka consumer error connecting the consumer: ', String(e));
  }
}

export async function runKafkaConsumer() {
  try {
    await consumer.subscribe({ topic });
    await consumer.run({
      eachMessage: async ({ message }) => {
        await handleMessage(message);
      }
    });
  } catch (e) {
    console.error('Kafka consumer error running consumer:', String(e));
  }
}

Running the Kafka consumer in hooks:

// runKafkaConsumer is async, so a synchronous try/catch will not catch its
// rejections; handle them on the returned promise instead.
runKafkaConsumer().catch((e) => {
  console.log('running kafka consumer', String(e));
});

Can someone please help with this?

@karimHsayni

in v1.16.0 it works correctly

@fakelag
fakelag commented Feb 6, 2025

Running a TypeScript project along with kafkajs & kafkajs-snappy. Setting "esModuleInterop": true in tsconfig fixed the issue. There could be some kind of import compatibility issue going on; I have not yet investigated deeper. (A sketch of the codec registration follows the version list below.)

kafkajs 2.2.4
kafkajs-snappy 1.1.0
nodejs 20.13.1
typescript 5.7.3
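If the interop theory is right, the usual way the snappy codec is registered is the piece that breaks. A sketch of that registration (as documented for kafkajs custom compression codecs), where the shape of the default import of the CommonJS kafkajs-snappy module depends on esModuleInterop:

import { CompressionTypes, CompressionCodecs } from 'kafkajs';
// With "esModuleInterop": true this default import resolves to the codec factory
// exported by the CommonJS module; without it, the imported value can end up
// being the module namespace object instead (assumption: that mismatch is the
// import compatibility issue described above).
import SnappyCodec from 'kafkajs-snappy';

CompressionCodecs[CompressionTypes.Snappy] = SnappyCodec;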

@Gabryellows

I have the same issue.
