In Node.js, streams are a powerful way to handle data in a continuous, chunk-by-chunk manner. Instead of loading an entire file or data source into memory, streams process data as it becomes available, making them highly efficient for large files, network requests, and other I/O operations.

There are four main types of streams:

  • Readable: A stream from which you can read data. Examples include fs.createReadStream() for files and the http.IncomingMessage object on a server.

  • Writable: A stream to which you can write data. Examples include fs.createWriteStream() for files and the http.ServerResponse object for a server’s response.

  • Duplex: A stream that is both Readable and Writable. An example is a net.Socket.

  • Transform: A Duplex stream that can modify or transform data as it is written and read. A good example is the zlib module for compression and decompression, as sketched below.
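
For instance, you can compress a file by piping it through a gzip Transform stream (pipe() is covered in the next section). A minimal sketch, where the file names are placeholders:

const fs = require('fs');
const zlib = require('zlib');

// Pipe the source file through a gzip Transform stream into a compressed copy.
// 'source.txt' and 'source.txt.gz' are placeholder file names.
fs.createReadStream('source.txt')
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('source.txt.gz'));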

How to Use Streams

The most common way to work with streams is the pipe() method. Piping connects a readable stream to a writable stream so that data flows from one to the other automatically, with backpressure managed for you.

Example 1: Reading and Writing Files

This is a classic example of using streams to copy a large file without consuming excessive memory.

const fs = require('fs');

// Create a readable stream from the source file
const readableStream = fs.createReadStream('source.txt');

// Create a writable stream to the destination file
const writableStream = fs.createWriteStream('destination.txt');

// Pipe the readable stream to the writable stream
// This will read data from source.txt and write it to destination.txt
readableStream.pipe(writableStream);

// You can also listen for events to know when the operation is complete
writableStream.on('finish', () => {
  console.log('File copy complete.');
});
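
One caveat: pipe() does not forward errors between streams, so the readable and writable streams each need their own 'error' listener. For this reason the built-in stream.pipeline() (available since Node.js 10) is often preferred: it connects the streams and reports any failure through a single callback. A minimal sketch of the same copy:

const fs = require('fs');
const { pipeline } = require('stream');

pipeline(
  fs.createReadStream('source.txt'),
  fs.createWriteStream('destination.txt'),
  (err) => {
    if (err) {
      console.error('File copy failed:', err);
    } else {
      console.log('File copy complete.');
    }
  }
);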

Example 2: Using Events (Low-Level Approach)

While pipe() is the easiest method, you can also handle streams through their events. This gives you finer-grained control over the data flow.

Readable Stream Events:

  • data: Fired when a chunk of data is available to be read. Attaching a 'data' listener switches the stream into flowing mode.

  • end: Fired when there is no more data to be read.

  • error: Fired if an error occurs.

Writable Stream Events:

  • finish: Fired after end() has been called and all buffered data has been flushed to the underlying system.

  • error: Fired if an error occurs.

Here’s how you would manually read chunks from a readable stream and forward them to a writable stream using events:

const fs = require('fs');

const readableStream = fs.createReadStream('source.txt');
const writableStream = fs.createWriteStream('destination.txt');

// Read data chunk by chunk
readableStream.on('data', (chunk) => {
  // Write the chunk to the writable stream
  writableStream.write(chunk);
});

// When the readable stream ends, close the writable stream
readableStream.on('end', () => {
  writableStream.end();
});

// Handle errors
readableStream.on('error', (err) => {
  console.error('Error reading from file:', err);
});
writableStream.on('error', (err) => {
  console.error('Error writing to file:', err);
});
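
Note that this version ignores backpressure: if the writable stream cannot keep up, write() returns false and chunks accumulate in memory. pipe() manages this automatically; to handle it manually, pause the readable stream until the writable stream emits 'drain'. A sketch of the 'data' handler above with backpressure handling added:

readableStream.on('data', (chunk) => {
  // write() returns false when the writable stream's internal buffer is full
  if (!writableStream.write(chunk)) {
    // Pause reading until the buffer has drained, then resume
    readableStream.pause();
    writableStream.once('drain', () => readableStream.resume());
  }
});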

When to Use Streams

Use streams in Node.js whenever you are dealing with large amounts of data or when the data is not immediately available, such as:

  • Reading or writing large files.

  • Handling data from network requests (HTTP, TCP sockets), as sketched after this list.

  • Processing compressed data using zlib.

  • Working with data from other processes.
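
For example, an HTTP server can stream a file straight into the response rather than buffering it in memory first. A minimal sketch, where the file name and port are placeholders:

const fs = require('fs');
const http = require('http');

http.createServer((req, res) => {
  // res is a writable stream, so a file can be piped directly into it
  const fileStream = fs.createReadStream('large-file.txt');
  fileStream.on('error', () => {
    res.statusCode = 500;
    res.end('Internal Server Error');
  });
  fileStream.pipe(res);
}).listen(3000);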
