Node.js Interview Prep: Questions + Answers (With Code Examples) | Zero To Mastery

Beginner Node.js interview questions

#1. What is Node.js, and why is it used?
Node.js is a runtime environment that allows you to
run JavaScript outside the browser.

Traditionally, JavaScript was limited to frontend tasks, but Node.js expanded its use to backend development, enabling developers to build the entire stack of an application using one language - JavaScript.

Another key feature of Node.js is its non-blocking, event-driven architecture. This design allows it to handle multiple tasks simultaneously, such as processing user requests or fetching data from a database, without waiting for one task to finish.

Because of its ease of use, Node.js is widely used for everything from e-commerce applications and RESTful APIs to IoT projects.

#2. What is the Node.js event loop?
The event loop is the core mechanism that enables
Node.js to handle multiple tasks efficiently on a
single thread.

When you perform an operation like reading a file, Node.js doesn’t wait for the task to complete. Instead, it delegates the task to the operating system and moves on to handle other tasks in the queue. Once the task finishes, the event loop picks up the result and executes the associated callback function.

This asynchronous, non-blocking approach is what makes Node.js highly scalable and efficient, especially for I/O-intensive tasks like serving multiple users or processing API requests.
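To make this concrete, here is a small illustrative snippet (not from the original article; example.txt is a placeholder file). The synchronous logs run first, and the file callback runs later, once the event loop picks up the completed read:

const fs = require('fs');

console.log('Start');

// The read is delegated to the operating system / libuv;
// the callback runs later via the event loop.
fs.readFile('example.txt', 'utf8', (err, data) => {
  if (err) throw err;
  console.log('File contents received');
});

console.log('End'); // Prints before the file callback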


#3. What are npm and package.json?
npm, short for Node Package Manager, is a tool for
managing dependencies in your Node.js projects. It
allows you to install, update, and remove libraries
(called packages) with ease, saving you time and
effort when adding functionality to your applications.

The package.json file serves as the blueprint for your project. It includes essential details like the project name, version, dependencies, and scripts for automating tasks like starting your app or running tests. For example, installing a library like Express using npm automatically updates your package.json file to track the dependency.

Together, npm and package.json streamline development and ensure consistency across environments.
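For reference, a minimal package.json might look like the following; the project name, script commands, and version numbers here are placeholders rather than values from the article:

{
  "name": "my-app",
  "version": "1.0.0",
  "scripts": {
    "start": "node index.js",
    "test": "node test.js"
  },
  "dependencies": {
    "express": "^4.18.2"
  }
}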

#4. How do you create a basic HTTP server in Node.js?
Node.js makes it simple to create an HTTP server
using the built-in http module.

For example:

const http = require('http');

const server = http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Hello, World!');
});

server.listen(3000, () => {
  console.log('Server is running on port 3000');
});

Here’s what happens:

The http.createServer method creates the server

A callback function handles incoming requests and sends responses

The listen method specifies the port (3000) where the server will run

This example demonstrates the foundation of a Node.js web server, which you can expand with routing, middleware, or database integration.

#5. What are the differences between require() and import?
Both require() and import are used to include
code from other files or libraries, but they belong to
different module systems.

require()

Part of CommonJS, the default module system in Node.js

Synchronous and works in all Node.js versions without additional configuration

For example, using require():

const fs = require('fs');

import:

Part of ES6 modules, offering a modern and concise syntax

Asynchronous and requires enabling ES modules by adding "type": "module" to your package.json

For example:

import fs from 'fs';

In modern projects, import is preferred for its cleaner syntax and consistency with the broader JavaScript ecosystem, while require() remains common in older codebases.

#6. What is the fs module, and how do synchronous and asynchronous file operations work in Node.js?
The fs module provides tools to interact with the
file system, such as reading, writing, or deleting files
and directories.

Here’s an example of reading a file asynchronously:

const fs = require('fs');

fs.readFile('example.txt', 'utf8', (err, data) => {
  if (err) {
    console.error(err);
    return;
  }
  console.log(data);
});

For synchronous operations:

const data = fs.readFileSync('example.txt', 'utf8');
console.log(data);

Synchronous Methods: Block the rest of the application while completing the operation

Asynchronous Methods: Use callbacks or promises to keep the application responsive

Asynchronous methods are generally preferred for scalability, except during initialization tasks like loading configurations.

#7. What are modules in Node.js, and how do you use them?

Modules in Node.js are reusable blocks of code that help organize functionality into smaller, manageable pieces.

There are three types of modules:

1. Core Modules: Built into Node.js (e.g., fs, http, path)

2. Local Modules: Custom modules you create within your project

3. Third-Party Modules: Installed via npm (e.g., Express)

Here’s an example of a local module.

math.js

function add(a, b) {
return a + b;
}

module.exports = add;

In app.js:

const add = require('./math');
console.log(add(2, 3)); // Output: 5

Modules promote reusability, maintainability, and separation of concerns in your code.

#8. What are streams in Node.js, and how do they work?
Streams in Node.js process data piece-by-piece,
making them memory-efficient for handling large
datasets. Instead of loading everything into memory
at once, streams process data in chunks.

There are four types of streams:

1. Readable: For reading data (e.g., file or API)

2. Writable: For writing data (e.g., to a file or network)

3. Duplex: For both reading and writing (e.g., network sockets)

4. Transform: For modifying data as it flows (e.g., compression)

For example (Readable Stream):

const fs = require('fs');
const readableStream = fs.createReadStream('example.txt');

readableStream.on('data', (chunk) => {
  console.log('Chunk received:', chunk);
});

readableStream.on('end', () => {
  console.log('File reading complete');
});

Streams are essential for tasks like processing large files, streaming video, or handling real-time data.

Intermediate Node.js interview questions

#9. What is middleware in Node.js, and how is it used in Express?
Middleware in Node.js is a function that has access to
the request and response objects, as well as the
next() function. It’s commonly used in Express to
handle tasks like logging, authentication, error
handling, and parsing incoming requests.

Here’s an example of a simple logging middleware:

const express = require('express');
const app = express();

app.use((req, res, next) => {
  console.log(`${req.method} request to ${req.url}`);
  next(); // Pass control to the next middleware or route handler
});

app.get('/', (req, res) => {
  res.send('Hello, World!');
});

app.listen(3000, () => console.log('Server running on port 3000'));

In this example:

Middleware logs the HTTP method and URL for every request

Middleware functions execute in the order they’re defined, making them flexible for building modular and reusable application features

#10. How do you handle errors in Node.js?
Error handling is essential in Node.js, especially since
many operations are asynchronous. There are several
common patterns:

Using Callbacks

Many asynchronous methods accept a callback with an err parameter.

fs.readFile('file.txt', 'utf8', (err, data) => {
  if (err) {
    console.error('Error reading file:', err);
    return;
  }
  console.log(data);
});

Using Promises

Promises handle errors with .catch().

fs.promises.readFile('file.txt', 'utf8')
  .then((data) => console.log(data))
  .catch((err) => console.error(err));


Using try...catch with Async/Await

Async/await provides a clean way to handle errors.

async function readFile() {
  try {
    const data = await fs.promises.readFile('file.txt', 'utf8');
    console.log(data);
  } catch (err) {
    console.error('Error:', err.message);
  }
}
readFile();

Each method is suited to different scenarios, but async/await is preferred in modern applications for its readability.

#11. How do you implement routing in a Node.js application?
Routing defines how an application responds to HTTP
requests for specific endpoints (URLs) and HTTP
methods (GET, POST, etc.).

In Node.js, you can handle routing with the built-in http module, but using a framework like Express significantly simplifies the process.

For example (Using Express):

const express = require('express');
const app = express();

// Define routes
app.get('/', (req, res) => {
  res.send('Welcome to the homepage!');
});

app.get('/about', (req, res) => {
  res.send('This is the about page');
});

app.post('/submit', (req, res) => {
  res.send('Form submitted!');
});

// Start the server
app.listen(3000, () => {
  console.log('Server running on port 3000');
});

How it works:

1. Routes are defined for specific HTTP methods and endpoints, such as GET / or POST /submit

2. Each route includes a handler function that specifies how the server responds. For example, the GET /about route sends a message: "This is the about page"

3. The listen method starts the server, making it accessible on the specified port

Why use Express for routing?

Simplified Syntax: Express reduces boilerplate code, making routes easy to define and manage

Middleware Integration: Middleware can be attached to routes for tasks like authentication, logging, or validation

Scalability: For larger applications, you can group related routes into separate modules, keeping your codebase organized and maintainable (see the sketch below)
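As a brief sketch of that last point (the file layout and route paths are illustrative, not from the article), express.Router() lets you keep related routes in their own module:

// routes/users.js
const express = require('express');
const router = express.Router();

router.get('/', (req, res) => res.send('All users'));
router.get('/:id', (req, res) => res.send(`User ${req.params.id}`));

module.exports = router;

// app.js
const express = require('express');
const app = express();
const usersRouter = require('./routes/users');

app.use('/users', usersRouter); // All routes in users.js are mounted under /users
app.listen(3000);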

#12. How do you manage environment variables in Node.js?
Environment variables store configuration details,
such as database credentials or API keys, outside
your codebase. This makes your application more
secure and flexible across environments like
development, testing, and production.

Using dotenv (Most common method)


The dotenv package is the most widely used way to
manage environment variables in Node.js.


Here’s how to use it:

Install the dotenv package:

npm install dotenv

Create a .env file:

DB_HOST=localhost
DB_USER=root
DB_PASS=securepassword

Load variables in your application:

require('dotenv').config();

const dbHost = process.env.DB_HOST;
console.log(`Connecting to database at ${dbHost}`);

This approach ensures sensitive information isn’t hardcoded and makes deployment across different environments seamless.

Alternative method: Using Node.js 20.6.0+ Native Support

Starting with Node.js 20.6.0, you can load environment variables from a .env file without installing any additional packages.

Simply run your Node.js application with the --env-file flag:

node --env-file=.env app.js

This command automatically loads the variables defined in the .env file into process.env.

Which method should you use?

It depends:


The dotenv package remains the most common choice because it works across all versions of Node.js and offers flexibility for advanced use cases.

The built-in support is great for simple setups, but it lacks features like handling multiple environment files.

Advanced Node.js interview questions

This section covers more complex topics, focusing on optimization, scalability, and advanced concepts in Node.js.

#13. What is clustering in Node.js, and how does it improve performance?
Clustering in Node.js allows you to create multiple
instances of your application to take advantage of
multi-core processors. By default, Node.js runs on a
single thread, but clustering enables the workload to
be distributed across multiple CPU cores.

Here’s an example:

const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isMaster) {
  const numCPUs = os.cpus().length;
  for (let i = 0; i < numCPUs; i++) {
    cluster.fork(); // Create a worker for each CPU core
  }
} else {
  http.createServer((req, res) => {
    res.writeHead(200);
    res.end('Hello, World!');
  }).listen(3000);
}


In this setup:

The master process forks a worker process for each CPU core

Each worker handles incoming requests, distributing the load and improving performance

Clustering is ideal for CPU-intensive tasks and high-traffic applications.

#14. What are worker threads in Node.js, and when should you use them?
Worker threads allow you to run JavaScript code in
parallel threads, which is useful for CPU-intensive
tasks. Unlike child processes, worker threads share
memory with the main thread, making them more
efficient for tasks requiring shared state.

Example using worker threads:

const { Worker, isMainThread, parentPort } = require('worker_threads');

if (isMainThread) {
  const worker = new Worker(__filename); // Re-run this file as a worker thread
  worker.on('message', (msg) => console.log(msg));
} else {
  parentPort.postMessage('Hello from the worker thread');
}

We use worker threads for computationally expensive tasks, like algorithms or data processing, where shared memory access is beneficial.

#15. What is event loop starvation, and how can it be prevented?
Event loop starvation occurs when long-running
tasks block the event loop, preventing it from
handling other tasks. This can make your application
unresponsive.


Example of a blocking task:

while (true) {
  // This loop blocks the event loop and starves all other tasks
}

How to prevent event loop starvation:

Use worker threads: Offload CPU-intensive tasks to separate threads

Asynchronous operations: Avoid blocking the event loop with synchronous methods

Split tasks into smaller chunks: Process large tasks incrementally to allow the event loop to handle other operations in between (see the sketch below)

By designing your application with non-blocking principles, you can keep the event loop responsive and ensure scalability.
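As a sketch of the chunking idea above (the batch size and workload are illustrative), a long computation can yield back to the event loop between batches with setImmediate():

function processLargeArray(items, batchSize = 1000) {
  let index = 0;

  function processBatch() {
    const end = Math.min(index + batchSize, items.length);
    for (; index < end; index++) {
      // ... do work on items[index]
    }
    if (index < items.length) {
      // Yield so pending I/O and timers can run before the next batch
      setImmediate(processBatch);
    }
  }

  processBatch();
}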

#16. What are the differences between process.nextTick() and setImmediate()?
Both process.nextTick() and setImmediate() schedule callbacks for asynchronous execution, but they differ in when they execute:

process.nextTick(): Executes callbacks at the end of the current operation, before any I/O events are processed

setImmediate(): Executes callbacks after I/O events, as part of the check phase in the event loop

For example:

process.nextTick(() => console.log('This runs first'));
setImmediate(() => console.log('This runs second'));

console.log('This runs before both');

Output:

This runs before both
This runs first
This runs second

Use process.nextTick() for tasks that need to execute immediately after the current operation and setImmediate() for tasks that can wait until the I/O cycle completes.

#17. How do you debug memory leaks in Node.js?
Memory leaks occur when memory that is no longer
needed is not released.

Common causes include unreferenced variables, event listeners not being removed, or large objects being unintentionally kept in scope.

Steps to debug memory leaks:

1. Monitor memory usage: Use process.memoryUsage() to track memory over time (see the sketch below)

2. Capture heap snapshots: Use Chrome DevTools or the v8 module to analyze memory usage

3. Use diagnostic tools: Tools like clinic.js, memwatch, or node-inspect help identify memory leaks

4. Remove unused listeners: Ensure event listeners are properly removed using removeListener or off

By proactively profiling and monitoring your application, you can identify and fix memory leaks before they impact performance.
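A small sketch of step 1 (the interval and unit conversion are illustrative): log heap usage periodically and watch for values that climb steadily without ever dropping.

setInterval(() => {
  const { heapUsed, heapTotal, rss } = process.memoryUsage();
  const mb = (bytes) => (bytes / 1024 / 1024).toFixed(1);
  // heapUsed that only grows over time suggests a leak
  console.log(`heapUsed: ${mb(heapUsed)} MB, heapTotal: ${mb(heapTotal)} MB, rss: ${mb(rss)} MB`);
}, 10000);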

How did you do?

Section 1: Core Concepts

1. How does the event loop work in Node.js, and how is it different from the browser?

Explanation:
The Node.js event loop allows Node.js to perform non-blocking I/O operations despite JavaScript being single-threaded. It does this by offloading operations to the system kernel or a thread pool and running callbacks asynchronously in different phases.

Phases of the Node.js event loop:

1. Timers: Executes callbacks scheduled by setTimeout() and setInterval().

2. Pending Callbacks: Executes some system-level callbacks deferred to the next loop iteration.

3. Idle, Prepare: Internal use for Node.js.

4. Poll: Retrieves new I/O events, executes I/O-related callbacks.

5. Check: Executes callbacks scheduled by setImmediate().

6. Close Callbacks: Handles close events like socket.on('close').

Difference from browsers:

● Browsers integrate the event loop with rendering and user interaction, while the Node.js event loop is designed specifically for server-side asynchronous I/O.

● Browsers have extra phases related to UI rendering and user events.

● Node.js uses libuv under the hood for its event loop and asynchronous I/O handling.

Example:

console.log('start');

setTimeout(() => console.log('setTimeout'), 0);

setImmediate(() => console.log('setImmediate'));

process.nextTick(() => console.log('nextTick'));

console.log('end');

Output:

start
end
nextTick
setTimeout
setImmediate

2. Explain the difference between process.nextTick, setImmediate, and setTimeout.

● process.nextTick()

○ Runs callbacks immediately after the current operation completes, before the event loop continues.

○ Has higher priority than other timers or immediates.

○ Used to defer work but still execute ASAP.

● setImmediate()

○ Runs callbacks in the "check" phase of the event loop, after the poll phase.

○ Executes after I/O event callbacks.

○ Useful to run a callback immediately after I/O events.

● setTimeout(fn, 0)

○ Schedules the callback to run in the timers phase, with a minimum delay (usually around 1-2 ms).

○ Used for scheduling delayed execution.

Summary:

Function | Execution Time | Phase in Event Loop | Priority
process.nextTick | After current operation, before event loop continues | Before event loop continues | Highest
setImmediate | After poll phase | Check phase | Medium
setTimeout(..., 0) | After minimum timer delay (~1-2 ms) | Timers phase | Lowest

3. What are worker threads, and when would you use them?

● Node.js is single-threaded by design.

● Worker threads enable multi-threading to run JavaScript code on different threads.

● Useful for CPU-intensive tasks that would block the event loop (e.g., data processing, image manipulation).

● You create a worker thread with the worker_threads module.

● Communication between the main thread and workers is via message passing.

Example usage:

const { Worker } = require('worker_threads');

const worker = new Worker('./worker.js');

worker.on('message', (msg) => {
  console.log('Message from worker:', msg);
});

worker.postMessage('start');

4. How does Node.js handle asynchronous operations under the hood?

● Node.js relies on libuv, a C library that provides a thread pool and handles async I/O.

● I/O operations like file reading and network calls are offloaded to the thread pool or system kernel.

● When I/O completes, the callback is queued to the event loop for execution.

● JavaScript code remains single-threaded, but I/O can be concurrent (see the sketch below).

● This architecture enables high concurrency with minimal threads.

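A minimal sketch of that concurrency (the file names are placeholders): several reads are dispatched to libuv at once and complete independently, while the single JavaScript thread simply awaits the results.

const fs = require('fs').promises;

async function readAllConcurrently() {
  // All three reads are in flight at the same time; libuv handles them
  // on its thread pool while this single JS thread awaits the results.
  const [a, b, c] = await Promise.all([
    fs.readFile('a.txt', 'utf8'),
    fs.readFile('b.txt', 'utf8'),
    fs.readFile('c.txt', 'utf8'),
  ]);
  console.log(a.length, b.length, c.length);
}

readAllConcurrently().catch(console.error);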

5. Explain how garbage collection works in Node.js.

● Node.js uses the V8 JavaScript engine, which implements generational garbage collection.

● It separates objects into young generation (newly created) and old generation (long-lived).

● Minor GC: cleans the young generation frequently (fast).

● Major GC: cleans the old generation less frequently (more expensive).

● GC is incremental and mostly concurrent to reduce pause times.

● Developers should be mindful of memory leaks to avoid GC overhead (see the sketch below).

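One hedged way to observe this (the flag and allocation size are illustrative): run Node with --expose-gc, allocate and drop a large object, then force a collection and compare heap usage.

// Run with: node --expose-gc gc-demo.js
const before = process.memoryUsage().heapUsed;

// Allocate a large, short-lived array (young-generation pressure)
let data = new Array(1_000_000).fill('x');
data = null; // Drop the reference so it becomes collectable

if (global.gc) {
  global.gc(); // Force a collection (only available with --expose-gc)
}

const after = process.memoryUsage().heapUsed;
console.log(`Heap before: ${before} bytes, after GC: ${after} bytes`);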
6. How do streams work in Node.js, and when would you use them?

● Streams allow efficient handling of large data by processing it piece-by-piece.

● Types of streams:

○ Readable: streams data out (e.g., reading files).

○ Writable: streams data in (e.g., writing files).

○ Duplex: readable and writable (e.g., TCP socket).

○ Transform: modifies data as it passes through (e.g., compression).

● Use streams when working with large files, network data, or any data that shouldn't be buffered fully in memory.

Example:

const fs = require('fs');
const readStream = fs.createReadStream('largefile.txt');
const writeStream = fs.createWriteStream('copy.txt');

readStream.pipe(writeStream);
7. What is the purpose of the cluster module, and how does it handle scaling?

● The cluster module allows creating multiple Node.js processes (workers) sharing the same server port.

● Each worker runs in its own process and can handle a subset of requests.

● Useful to utilize multi-core CPUs and improve throughput.

● The master process manages workers and can restart them if they crash.

Example:

const cluster = require('cluster');
const http = require('http');
const numCPUs = require('os').cpus().length;

if (cluster.isMaster) {
  for (let i = 0; i < numCPUs; i++) {
    cluster.fork();
  }
} else {
  http.createServer((req, res) => {
    res.end('Hello world');
  }).listen(8000);
}

That covers Core Concepts fully with examples!

If you're good so far, I can proceed with the Performance and Optimization section next.

Let me know!
Section 2: Performance and Optimization

1. How would you handle a memory leak in a Node.js application?

Understanding Memory Leaks:
Memory leaks happen when objects that are no longer needed are still referenced, preventing garbage collection and leading to increased memory usage.

Common causes:

● Global variables

● Closures holding references unnecessarily

● Event listeners not removed

● Caches that grow indefinitely

How to detect:

● Use Chrome DevTools or Node.js Inspector for heap snapshots and memory profiling.

● Use tools like the heapdump module to generate snapshots.

● Monitor process memory with process.memoryUsage().

How to fix:

● Identify objects that are not released in heap snapshots.

● Remove event listeners when no longer needed (emitter.removeListener).

● Use weak references (WeakMap, WeakSet) for caches.

● Limit cache sizes or implement eviction policies.

Example (a leaky pattern; a fixed version is sketched after it):

const EventEmitter = require('events');
const emitter = new EventEmitter();

function leak() {
  const hugeArray = new Array(1000000).fill('leak');
  emitter.on('event', () => {
    console.log(hugeArray.length);
  });
}

setInterval(leak, 1000);

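A sketch of one fix for this particular leak: register a single listener up front instead of adding a new closure (which captures hugeArray) on every interval tick.

const EventEmitter = require('events');
const emitter = new EventEmitter();

// Register one listener once, instead of adding a new one per call.
emitter.on('event', () => console.log('event handled'));

function doWork() {
  const hugeArray = new Array(1000000).fill('ok');
  // hugeArray is only referenced locally, so it can be garbage collected
  // as soon as doWork returns.
  console.log(hugeArray.length);
}

setInterval(doWork, 1000);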
2. How do you optimize a high-throughput API built with Node.js?

Best practices:

● Use clustering: Utilize all CPU cores by spawning workers (cluster module).

● Leverage caching: Cache frequent data using Redis or an in-memory cache with TTL (see the sketch after this list).

● Avoid blocking the event loop: Offload CPU-intensive tasks to worker threads or external services.

● Use streams: Stream large payloads instead of buffering entire content in memory.

● Efficient database queries: Use indexes, limit query results, and optimize joins.

● Limit request size: Protect against large payloads causing memory exhaustion.

● Load balancing: Use reverse proxies (NGINX) or cloud load balancers.

● HTTP/2: Supports multiplexing for better concurrency.

● Connection pooling: For DB or external APIs to reduce overhead.

● Profiling: Use tools like Clinic.js or the Node.js built-in profiler to find bottlenecks.
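As a sketch of the caching point above (the TTL, route, and fetchProducts() helper are illustrative assumptions; a production setup would more likely use Redis):

const cache = new Map();

function getCached(key, ttlMs, computeFn) {
  const entry = cache.get(key);
  if (entry && entry.expires > Date.now()) {
    return Promise.resolve(entry.value); // Cache hit
  }
  return Promise.resolve(computeFn()).then((value) => {
    cache.set(key, { value, expires: Date.now() + ttlMs });
    return value;
  });
}

// Usage in an Express handler (assumes an existing app and a fetchProducts() helper):
app.get('/products', async (req, res) => {
  const products = await getCached('products', 30_000, fetchProducts);
  res.json(products);
});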
3. What are the trade-offs between child processes, worker threads, and clustering?

Feature | Child Processes | Worker Threads | Cluster Module
Architecture | Separate processes | Threads within single process | Multiple processes sharing server
Communication | IPC via messaging | Shared memory, message passing | IPC messaging
Use case | Heavy CPU tasks, isolation needed | CPU-intensive JS tasks | Scale multi-core for concurrency
Overhead | Higher (full process) | Lower (threads share memory) | Moderate (process management)
Stability | High, crashes isolated | Shared memory can cause issues | Each worker isolated
Complexity | Moderate | More complex (thread safety) | Simple for HTTP servers
4. How would you monitor and profile a Node.js application in production?

Tools & methods:

● Logging: Use structured logging (e.g., winston, pino) for tracing.

● APM tools: New Relic, Datadog, AppDynamics offer deep profiling and monitoring.

● Node.js built-in profiler: Use node --inspect and Chrome DevTools.

● Heap snapshots: To check memory usage and leaks.

● CPU profiling: To identify hotspots.

● Metrics: Use Prometheus or StatsD for tracking latency, throughput, errors.

● Health checks: Monitor app responsiveness.

● Error tracking: Use Sentry or Rollbar.

5. What are some ways to prevent blocking the event loop?

● Avoid heavy synchronous operations (e.g., large loops, sync file I/O).

● Offload CPU-heavy tasks to worker threads or child processes.

● Use asynchronous APIs consistently.

● Break large computations into smaller chunks and schedule them with setImmediate or process.nextTick.

● Use streaming APIs instead of buffering large data.

● Limit concurrency for tasks that can overload the event loop.

That completes the Performance and Optimization section.

Shall I continue with the Security section next?


Section 3: Security

1. How do you prevent prototype pollution in a Node.js application?

Prototype pollution is an attack that modifies the prototype of built-in objects, leading to unexpected behavior or security vulnerabilities.

How it happens:
If user input is merged into objects without validation, malicious input like {"__proto__": {"isAdmin": true}} can pollute prototypes.

Prevention:

● Validate and sanitize user input before merging objects (see the sketch after this list).
● Use safe libraries (like lodash with safe merge functions).
● Avoid merging user data directly into objects without checks.
● Use packages like fast-safe-set or deepmerge that protect against prototype pollution.
● Upgrade dependencies regularly to patch vulnerabilities.

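A hedged sketch of the validation idea (the helper name safeMerge is illustrative): skip keys that could rewrite the prototype chain before copying user input into an object.

function safeMerge(target, source) {
  for (const key of Object.keys(source)) {
    // Skip keys that could rewrite the prototype chain
    if (key === '__proto__' || key === 'constructor' || key === 'prototype') continue;
    const value = source[key];
    if (value && typeof value === 'object' && !Array.isArray(value)) {
      const base = target[key] && typeof target[key] === 'object' ? target[key] : {};
      target[key] = safeMerge(base, value);
    } else {
      target[key] = value;
    }
  }
  return target;
}

// A payload like '{"__proto__": {"isAdmin": true}}' no longer pollutes Object.prototype:
safeMerge({}, JSON.parse('{"__proto__": {"isAdmin": true}}'));
console.log({}.isAdmin); // undefined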
2. What security risks does eval() pose in a Node.js app?

eval() executes a string as JavaScript code, which can run arbitrary code.

Risks:

● Remote code execution if user input is passed unchecked.
● Difficult to audit or secure.
● Can lead to injection attacks.

Best Practice:

● Avoid eval() altogether.

● Use safer alternatives like JSON parsing or dedicated expression evaluators.
3. How do you handle JWT authentication securely in a Node.js backend?

Best practices:

● Use strong secret keys or RSA key pairs for signing tokens.

● Set appropriate expiration times (exp claim).

● Validate tokens on every request.

● Use HTTPS to protect tokens in transit.

● Store tokens securely on the client side (e.g., HTTP-only cookies).

● Implement token revocation or a blacklist if needed.

● Avoid storing sensitive info in the JWT payload (only non-sensitive claims).

● Use libraries like jsonwebtoken and keep them updated (see the sketch after this list).

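A brief sketch of these practices using the jsonwebtoken library (the secret is assumed to come from an environment variable, and the middleware shape is illustrative):

const jwt = require('jsonwebtoken');

// Sign a short-lived token containing only non-sensitive claims
const token = jwt.sign({ sub: 'user-123', role: 'user' }, process.env.JWT_SECRET, {
  expiresIn: '15m',
});

// Verify on every request (e.g., as an Express middleware)
function authenticate(req, res, next) {
  const header = req.headers.authorization || '';
  const incoming = header.replace('Bearer ', '');
  try {
    req.user = jwt.verify(incoming, process.env.JWT_SECRET);
    next();
  } catch (err) {
    res.status(401).send('Invalid or expired token');
  }
}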
4. How does CSRF protection work in an API, and do you need it in a RESTful service?

CSRF (Cross-Site Request Forgery): Tricks authenticated users into submitting unwanted requests.

Protection mechanisms:

● Use CSRF tokens (random tokens in forms/requests).
● Validate origin or referer headers.
● Use SameSite cookies to restrict cross-site usage (see the sketch below).

Do RESTful APIs need CSRF protection?

● Typically, REST APIs used with tokens (like JWT in Authorization headers) are not vulnerable to CSRF, since tokens aren't automatically sent by browsers.

● CSRF protection is mainly needed for cookie-based auth sessions.

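As a sketch of the SameSite point (the cookie name and the sessionToken variable are illustrative assumptions), a session cookie can be issued from Express so browsers won't attach it to cross-site requests:

app.get('/login', (req, res) => {
  // Assumes an existing Express app and a sessionToken created elsewhere.
  res.cookie('session', sessionToken, {
    httpOnly: true,    // Not readable from client-side JavaScript
    secure: true,      // Only sent over HTTPS
    sameSite: 'strict' // Not sent on cross-site requests, which blunts CSRF
  });
  res.send('Logged in');
});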

5. What are the best practices for handling sensitive data (e.g., env variables, secrets management)?

● Do not hardcode secrets in code.

● Use environment variables or secure vaults (HashiCorp Vault, AWS Secrets Manager).

● Restrict access to secrets to the minimum necessary.

● Rotate secrets regularly.

● Use encryption at rest and in transit.

● Avoid logging secrets.

● Use .env files in development but exclude them from version control.

● Use tools like dotenv to load env vars securely.

That completes the Security section.

Would you like me to continue with the Code Challenges / Hands-on Exercises section now?

Just give me the Code Challenges / Hands-on Exercises.
Here are detailed solutions for the Code Challenges / Hands-on Exercises section, each with a
clear explanation and code example.

1. Implement a Custom Promise Class from Scratch


Explanation:
A custom Promise class must manage three states (pending, fulfilled, rejected), store
callbacks, and support chaining via .then().
Example:

class MyPromise {
  constructor(executor) {
    this.state = 'pending';
    this.value = undefined;
    this.handlers = [];
    const resolve = (value) => {
      if (this.state !== 'pending') return;
      this.state = 'fulfilled';
      this.value = value;
      this.handlers.forEach(h => h.onFulfilled(value));
    };
    const reject = (reason) => {
      if (this.state !== 'pending') return;
      this.state = 'rejected';
      this.value = reason;
      this.handlers.forEach(h => h.onRejected(reason));
    };
    try {
      executor(resolve, reject);
    } catch (err) {
      reject(err);
    }
  }
  then(onFulfilled, onRejected) {
    return new MyPromise((resolve, reject) => {
      const handler = {
        onFulfilled: value => {
          if (!onFulfilled) return resolve(value);
          try {
            resolve(onFulfilled(value));
          } catch (err) {
            reject(err);
          }
        },
        onRejected: reason => {
          if (!onRejected) return reject(reason);
          try {
            resolve(onRejected(reason));
          } catch (err) {
            reject(err);
          }
        }
      };
      if (this.state === 'fulfilled') handler.onFulfilled(this.value);
      else if (this.state === 'rejected') handler.onRejected(this.value);
      else this.handlers.push(handler);
    });
  }
}

2. Write a Function That Limits the Number of Concurrent Asynchronous Tasks Running at a Time
Explanation:
This pattern is called a "concurrency limiter." It ensures that only a specified number of async
tasks run simultaneously.
Example:

function runWithConcurrencyLimit(tasks, limit) {
  let i = 0, active = 0;
  return new Promise(resolve => {
    function next() {
      if (i === tasks.length && active === 0) return resolve();
      while (active < limit && i < tasks.length) {
        active++;
        tasks[i++]().then(() => {
          active--;
          next();
        });
      }
    }
    next();
  });
}

// Usage:
const tasks = Array.from({length: 10}, (_, i) => () =>
  new Promise(res => setTimeout(() => { console.log(i); res(); }, 1000))
);
runWithConcurrencyLimit(tasks, 3); // Only 3 tasks run at once
3. Build a Simple Rate Limiter Middleware in Express
Explanation:
A rate limiter restricts the number of requests a client can make in a given time window.
Example:

function rateLimiter(maxRequests, windowMs) {
  const hits = {};
  setInterval(() => Object.keys(hits).forEach(ip => hits[ip] = 0), windowMs);
  return (req, res, next) => {
    const ip = req.ip;
    hits[ip] = (hits[ip] || 0) + 1;
    if (hits[ip] > maxRequests) {
      return res.status(429).send('Too many requests');
    }
    next();
  };
}

// Usage with Express
const express = require('express');
const app = express();
app.use(rateLimiter(100, 60000)); // 100 requests per minute per IP

4. Implement an LRU Cache Using JavaScript


Explanation:
An LRU (Least Recently Used) cache evicts the least recently accessed item when the cache
exceeds its size limit. Use a Map for O(1) operations.
Example:

class LRUCache {
  constructor(limit) {
    this.limit = limit;
    this.cache = new Map();
  }
  get(key) {
    if (!this.cache.has(key)) return -1;
    const value = this.cache.get(key);
    this.cache.delete(key);
    this.cache.set(key, value); // Move to end (most recently used)
    return value;
  }
  put(key, value) {
    if (this.cache.has(key)) this.cache.delete(key);
    else if (this.cache.size === this.limit) {
      this.cache.delete(this.cache.keys().next().value); // Remove LRU
    }
    this.cache.set(key, value);
  }
}

// Usage
const lru = new LRUCache(2);
lru.put('a', 1);
lru.put('b', 2);
lru.get('a'); // 1
lru.put('c', 3); // 'b' is evicted
lru.get('b'); // -1

5. Create an Async Queue That Processes Tasks with a Given Concurrency Limit
Explanation:
An async queue processes tasks, but only up to concurrency at a time.
Example:

class AsyncQueue {
  constructor(concurrency) {
    this.concurrency = concurrency;
    this.running = 0;
    this.queue = [];
  }
  push(task) {
    this.queue.push(task);
    this.next();
  }
  next() {
    if (this.running >= this.concurrency || !this.queue.length) return;
    const task = this.queue.shift();
    this.running++;
    task().then(() => {
      this.running--;
      this.next();
    });
    this.next();
  }
}

// Usage
const queue = new AsyncQueue(2);
for (let i = 0; i < 5; i++) {
  queue.push(() => new Promise(res => setTimeout(() => {
    console.log('Task', i);
    res();
  }, 1000)));
}

These challenges are fundamental for Node.js interviews and backend engineering roles, demonstrating deep understanding of asynchronous programming, middleware, and efficient data structures [1] [2] [3].

1. https://www.softlogicsys.in/node-js-coding-challenges-with-solutions-for-beginners/
2. https://dev.to/snyk/essential-nodejs-backend-examples-for-developers-in-2024-2j00
3. https://zerotomastery.io/blog/node-js-interview-questions/
